
feat: add Chartmetric proxy endpoint with credit deduction#318

Open
sweetmantech wants to merge 3 commits into test from feature/chartmetric-proxy

Conversation


@sweetmantech commented Mar 20, 2026

Summary

  • Adds GET/POST /api/chartmetric/[...path] proxy that forwards requests to Chartmetric
  • Authenticates callers via validateAuthContext (API key or Bearer token)
  • Deducts 5 credits per call via deductCredits
  • Exchanges server-side CHARTMETRIC_REFRESH_TOKEN for short-lived access token (key never exposed to clients)
  • Adds lib/chartmetric/getChartmetricToken.ts and lib/chartmetric/proxyChartmetricRequest.ts
  • 5 vitest tests covering 401, 402, 500, GET proxy, and POST proxy paths
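
The token-exchange bullet above can be sketched roughly as follows. This is an illustrative reconstruction, not the PR's getChartmetricToken: the endpoint URL and the { refreshtoken } payload are assumptions about Chartmetric's token flow, and the refresh token is passed as a parameter here, whereas the PR reads CHARTMETRIC_REFRESH_TOKEN from the server environment.

```typescript
// Hedged sketch of a refresh-token -> access-token exchange. The URL and
// { refreshtoken } body shape are assumptions; verify against Chartmetric docs.
type FetchLike = (
  url: string,
  init?: { method?: string; headers?: Record<string, string>; body?: string },
) => Promise<{ ok: boolean; json(): Promise<{ token?: string }> }>;

async function getChartmetricAccessToken(
  refreshToken: string,
  fetchImpl: FetchLike = fetch,
): Promise<string> {
  const res = await fetchImpl("https://api.chartmetric.com/api/token", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ refreshtoken: refreshToken }),
  });
  if (!res.ok) throw new Error("Chartmetric token exchange failed");
  const { token } = await res.json();
  if (!token) throw new Error("Chartmetric token response missing token");
  return token; // short-lived access token, sent as a Bearer header upstream
}
```

Keeping the refresh token server-side and only ever forwarding the short-lived access token is what keeps the key out of clients.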

Test plan

  • pnpm test lib/chartmetric — all 5 tests pass
  • CHARTMETRIC_REFRESH_TOKEN env var set in Vercel/production
  • Proxy call deducts 5 credits from caller account

🤖 Generated with Claude Code

Summary by CodeRabbit

  • New Features
    • Added a new API endpoint that forwards requests to an external service with dynamic (no-cache) behavior
    • Implements per-request credit deduction to enforce usage limits
    • Performs secure server-side token retrieval for upstream authentication
    • Returns CORS-compatible proxied responses with robust error handling and status propagation

Recoup Agent and others added 2 commits March 20, 2026 00:06
Implements POST/GET /api/chartmetric/[...path] that authenticates via
validateAuthContext, deducts 1 credit per call, exchanges the server-side
CHARTMETRIC_REFRESH_TOKEN for an access token, and forwards the request
to the Chartmetric API. Includes full vitest test coverage (5 tests).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Chartmetric costs $350/month flat. At 5 credits ($0.05/call), we break
even at ~7,000 calls/month (~233/day). A typical research task (6-7
API calls) costs 30-35 credits ($0.30-0.35), which is fair pricing for
the data value delivered. At 1 credit/call we would need 35,000 calls/month to break even — unrealistic at current scale.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
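
The pricing rationale in the commit message above is simple arithmetic; a quick check, using the stated figures ($350/month flat fee, credits priced at $0.01), with money kept in integer cents to avoid floating-point drift:

```typescript
// Break-even check for the pricing commit message above.
const MONTHLY_COST_CENTS = 350 * 100; // $350/month flat Chartmetric fee
const CENTS_PER_CREDIT = 1;           // $0.01 per credit

/** Calls per month needed to cover the flat fee at a given per-call price. */
function breakEvenCallsPerMonth(creditsPerCall: number): number {
  return MONTHLY_COST_CENTS / (creditsPerCall * CENTS_PER_CREDIT);
}

console.log(breakEvenCallsPerMonth(5));                  // 7000 calls/month
console.log(Math.round(breakEvenCallsPerMonth(5) / 30)); // ~233 calls/day
console.log(breakEvenCallsPerMonth(1));                  // 35000 calls/month
console.log(7 * 5); // 35 credits ($0.35) for a 7-call research task
```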

vercel bot commented Mar 20, 2026

The latest updates on your projects.

Project: recoup-api
Deployment: Ready (Preview)
Updated (UTC): Mar 20, 2026 1:32pm



coderabbitai bot commented Mar 20, 2026

📝 Walkthrough

Adds a Chartmetric API proxy: a catch-all Next.js route forwards GET/POST to a proxy that authenticates callers, deducts credits, obtains a server-side Chartmetric token, and forwards requests to the Chartmetric API with CORS-enabled responses and dynamic runtime settings.

Changes

  • API Route Handler: app/api/chartmetric/[...path]/route.ts
    New catch-all Next.js route exporting GET and POST handlers that await path params and forward requests to proxyChartmetricRequest. Exports runtime flags: dynamic = "force-dynamic", fetchCache = "force-no-store", revalidate = 0.
  • Chartmetric Integration: lib/chartmetric/getChartmetricToken.ts, lib/chartmetric/proxyChartmetricRequest.ts
    New getChartmetricToken() exchanges a refresh token for an access token. proxyChartmetricRequest() validates auth, deducts credits, obtains an access token, forwards the client request to Chartmetric (preserving method, query, and optional body), and returns the upstream JSON with CORS headers and the upstream status; handles auth, credit, token, fetch, and parsing errors.
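
The forwarding described above has to rebuild the upstream URL from the catch-all path segments while preserving the caller's query string. A minimal sketch; constructChartmetricUrl and the base URL are illustrative assumptions, not code from the PR:

```typescript
// Hypothetical helper showing the URL rebuild; the base URL is assumed.
const CHARTMETRIC_API_BASE = "https://api.chartmetric.com/api";

function constructChartmetricUrl(pathSegments: string[], search: string): string {
  // Re-encode each catch-all segment so special characters survive the round trip.
  const path = pathSegments.map(encodeURIComponent).join("/");
  // `search` is the raw query string from the incoming request ("?..." or "").
  return `${CHARTMETRIC_API_BASE}/${path}${search}`;
}

console.log(constructChartmetricUrl(["artist", "3380"], "?since=2025-01-01"));
// https://api.chartmetric.com/api/artist/3380?since=2025-01-01
```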

Sequence Diagram

sequenceDiagram
    actor Client
    participant RouteHandler as Next.js Route<br/>Handler
    participant ProxyFn as proxyChartmetricRequest
    participant AuthSys as validateAuthContext
    participant CreditSys as deductCredits
    participant TokenFn as getChartmetricToken
    participant ChartmetricAPI as Chartmetric API

    Client->>RouteHandler: GET/POST /api/chartmetric/...
    RouteHandler->>ProxyFn: forward request + path params
    ProxyFn->>AuthSys: validate caller
    alt auth fails
        AuthSys-->>ProxyFn: NextResponse (error)
        ProxyFn-->>Client: return auth error
    else auth succeeds
        AuthSys-->>ProxyFn: caller identity
        ProxyFn->>CreditSys: deduct credits (5)
        alt insufficient credits
            CreditSys-->>ProxyFn: error (402)
            ProxyFn-->>Client: CORS 402 response
        else credits deducted
            CreditSys-->>ProxyFn: success
            ProxyFn->>TokenFn: request access token
            alt token error
                TokenFn-->>ProxyFn: error
                ProxyFn-->>Client: CORS 500 response
            else token received
                TokenFn-->>ProxyFn: access_token
                ProxyFn->>ChartmetricAPI: fetch(upstream URL, Bearer token, body?)
                ChartmetricAPI-->>ProxyFn: response JSON + status
                ProxyFn-->>Client: CORS response with upstream status
            end
        end
    end

Estimated Code Review Effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Poem

Tokens traded in the silent stream,
Credits counted, rules redeemed,
A route that listens, then relays,
Chartmetric whispers, proxy plays,
Requests flow home in guarded beams.

🚥 Pre-merge checks | ❌ 1

❌ Failed checks (1 warning)

  • Solid & Clean Code: ⚠️ Warning
    Explanation: The proxyChartmetricRequest function violates SRP by handling 8+ distinct responsibilities (auth, credit deduction, token retrieval, URL construction, request body handling, proxying, parsing, CORS headers) in 105 lines, creating a "god function" that is difficult to test, maintain, and extend.
    Resolution: Extract concerns into separate utility functions: createChartmetricErrorResponse() for CORS error responses, constructChartmetricUrl() for URL building, a separate credit-deduction wrapper, and a dedicated body-reading utility. Add Zod validation in route.ts for context.params and move magic numbers to configuration constants.





@coderabbitai bot left a comment


Actionable comments posted: 2

🧹 Nitpick comments (3)
app/api/chartmetric/[...path]/route.ts (1)

16-34: Consider adding an OPTIONS handler for CORS preflight.

Browser-based clients making cross-origin requests will send a preflight OPTIONS request. While the responses include CORS headers via getCorsHeaders(), there's no explicit OPTIONS handler to respond to preflight requests.

If this API is intended for server-to-server use only, this is fine. But if browser clients will call it directly, an OPTIONS handler is needed.

♻️ Add OPTIONS handler if browser clients are expected
import { getCorsHeaders } from "@/lib/networking/getCorsHeaders";
import { NextResponse } from "next/server";

/**
 * OPTIONS /api/chartmetric/[...path]
 *
 * Handles CORS preflight requests.
 */
export async function OPTIONS() {
  return new NextResponse(null, { status: 204, headers: getCorsHeaders() });
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@app/api/chartmetric/`[...path]/route.ts around lines 16 - 34, Add an OPTIONS
handler to this route to respond to CORS preflight requests: export an async
function OPTIONS() that returns a 204 NextResponse with headers from
getCorsHeaders(), and import getCorsHeaders and NextResponse at the top; place
it alongside the existing GET/POST handlers (which call proxyChartmetricRequest)
so browser clients receive proper CORS preflight responses.
lib/chartmetric/proxyChartmetricRequest.ts (2)

92-97: Handle non-JSON upstream responses gracefully.

If Chartmetric returns a non-JSON response (e.g., an HTML error page during outages), chartmetricResponse.json() will throw. While the catch block handles this, the error message will be generic. Consider checking Content-Type or wrapping the JSON parse more explicitly.

♻️ Suggested improvement
   try {
     const chartmetricResponse = await fetch(chartmetricUrl, {
       method: request.method,
       headers: {
         Authorization: `Bearer ${accessToken}`,
         "Content-Type": "application/json",
       },
       ...(body ? { body } : {}),
     });

-    const data = await chartmetricResponse.json();
+    const contentType = chartmetricResponse.headers.get("content-type") ?? "";
+    if (!contentType.includes("application/json")) {
+      const text = await chartmetricResponse.text();
+      return NextResponse.json(
+        { status: "error", error: `Chartmetric returned non-JSON response: ${text.slice(0, 200)}` },
+        { status: 502, headers: getCorsHeaders() },
+      );
+    }
+    const data = await chartmetricResponse.json();

     return NextResponse.json(data, {
       status: chartmetricResponse.status,
       headers: getCorsHeaders(),
     });
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@lib/chartmetric/proxyChartmetricRequest.ts` around lines 92 - 97, The code
currently calls chartmetricResponse.json() which will throw on non-JSON
responses; update proxyChartmetricRequest to first inspect
chartmetricResponse.headers.get('content-type') (or attempt a safe JSON parse)
and only call .json() when content-type includes 'application/json'; otherwise
call chartmetricResponse.text() and return that body (or a structured fallback
error) via NextResponse.json/NextResponse.text with the original
chartmetricResponse.status and getCorsHeaders(); ensure chartmetricResponse and
getCorsHeaders are the referenced symbols used and keep existing status/headers
behavior for all branches.

38-46: String-based error detection is fragile coupling.

The check message.toLowerCase().includes("insufficient credits") couples this code to the exact wording of deductCredits's error message. If that message changes, the 402 response breaks silently and becomes a 500.

Consider having deductCredits throw a typed error (or return a discriminated result) so consumers can reliably detect insufficient credits without string parsing.

♻️ Example: Use a custom error class

In lib/credits/deductCredits.ts:

export class InsufficientCreditsError extends Error {
  constructor(required: number, available: number) {
    super(`Insufficient credits. Required: ${required}, Available: ${available}`);
    this.name = "InsufficientCreditsError";
  }
}

Then in proxyChartmetricRequest.ts:

-  } catch (err) {
-    const message = err instanceof Error ? err.message : String(err);
-
-    if (message.toLowerCase().includes("insufficient credits")) {
+  } catch (err) {
+    if (err instanceof InsufficientCreditsError) {
       return NextResponse.json(
         { status: "error", error: "Insufficient credits for Chartmetric API call" },
         { status: 402, headers: getCorsHeaders() },
       );
     }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@lib/chartmetric/proxyChartmetricRequest.ts` around lines 38 - 46, The catch
block in proxyChartmetricRequest.ts currently detects insufficient credits by
string-parsing the error message which is fragile; update deductCredits to throw
a dedicated error type (e.g., export class InsufficientCreditsError extends
Error or return a discriminated result) and then change the catch in
proxyChartmetricRequest (and any other consumers) to detect that case via
instanceof InsufficientCreditsError (or by checking the discriminant) and return
the 402 NextResponse.json with getCorsHeaders(); otherwise rethrow or handle
other errors normally.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: ffa7e8e1-ca9c-433e-bbae-b30f6fc3e14a

📥 Commits

Reviewing files that changed from the base of the PR and between 5dbcacf and 21ba61f.

⛔ Files ignored due to path filters (1)
  • lib/chartmetric/__tests__/proxyChartmetricRequest.test.ts is excluded by !**/*.test.*, !**/__tests__/** and included by lib/**
📒 Files selected for processing (3)
  • app/api/chartmetric/[...path]/route.ts
  • lib/chartmetric/getChartmetricToken.ts
  • lib/chartmetric/proxyChartmetricRequest.ts

Comment on lines +4 to +15
/**
* GET /api/chartmetric/[...path]
*
* Proxies GET requests to the Chartmetric API on behalf of an authenticated account.
* Deducts 1 credit per call.
*
* @param request - Incoming API request.
* @param context - Route context containing the Chartmetric path segments.
* @param context.params - Route params with Chartmetric path segments.
* @param context.params.path - Array of path segments to forward to Chartmetric.
* @returns The Chartmetric API response.
*/

⚠️ Potential issue | 🟡 Minor

JSDoc says 1 credit but implementation deducts 5 credits.

The documentation states "Deducts 1 credit per call" (lines 8 and 24), but proxyChartmetricRequest actually deducts 5 credits. This inconsistency will mislead API consumers.

📝 Update JSDoc to reflect actual credit cost
 /**
  * GET /api/chartmetric/[...path]
  *
  * Proxies GET requests to the Chartmetric API on behalf of an authenticated account.
- * Deducts 1 credit per call.
+ * Deducts 5 credits per call.
  *

Apply the same fix to the POST handler's JSDoc (line 24).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@app/api/chartmetric/`[...path]/route.ts around lines 4 - 15, Update the JSDoc
comments for both the GET and POST handlers in route.ts to reflect the actual
credit cost (change "Deducts 1 credit per call" to "Deducts 5 credits per call")
so they match the implementation in proxyChartmetricRequest; locate the
top-of-file comment blocks above the GET handler and the POST handler and edit
the line mentioning credits to state 5 credits instead of 1.

Comment on lines +36 to +52
try {
await deductCredits({ accountId, creditsToDeduct: 5 });
} catch (err) {
const message = err instanceof Error ? err.message : String(err);

if (message.toLowerCase().includes("insufficient credits")) {
return NextResponse.json(
{ status: "error", error: "Insufficient credits for Chartmetric API call" },
{ status: 402, headers: getCorsHeaders() },
);
}

return NextResponse.json(
{ status: "error", error: "Failed to deduct credits" },
{ status: 500, headers: getCorsHeaders() },
);
}

⚠️ Potential issue | 🟠 Major

Credits deducted before confirming upstream success.

Currently, credits are deducted prior to the Chartmetric API call. If the upstream request fails (network error, 5xx, rate limit, etc.), the caller loses 5 credits without receiving data. This could lead to user frustration.

Consider one of these approaches:

  1. Deduct after success: Move credit deduction after a successful upstream response (risk: user could abort before deduction).
  2. Refund on failure: If the upstream call fails, issue a credit refund.
  3. Accept as-is: Document this behavior clearly so users understand they're charged per attempt, not per successful response.

If the current behavior is intentional (e.g., to cover costs of failed calls), it's worth documenting in the JSDoc or user-facing API docs.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@lib/chartmetric/proxyChartmetricRequest.ts` around lines 36 - 52, Credits are
being deducted before the Chartmetric upstream call (deductCredits is called
pre-request) which charges users even if the upstream request fails; to fix,
move the deductCredits call to after a successful upstream response (i.e., call
deductCredits only when the Chartmetric request returns a 2xx and data is
returned) or, if you prefer to keep pre-charging, implement a refund path by
calling a refund function (e.g., refundCredits or addCredits with
creditsToRefund: 5) inside the catch/failed-response handling so failed
Chartmetric requests trigger a refund, and update the NextResponse.json error
branches to invoke that refund call before returning (refer to deductCredits,
the upstream Chartmetric request handler, and existing error response paths
using NextResponse.json and getCorsHeaders).

…patibility

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

@coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (1)
app/api/chartmetric/[...path]/route.ts (1)

8-8: ⚠️ Potential issue | 🟡 Minor

JSDoc credit cost is inconsistent with actual deduction.

Line 8 and Line 25 say 1 credit, but the proxy implementation deducts 5 credits. Please align both handler docs with runtime behavior.

📝 Suggested doc fix
- * Deducts 1 credit per call.
+ * Deducts 5 credits per call.

Also applies to: 25-25

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@app/api/chartmetric/`[...path]/route.ts at line 8, The JSDoc in
app/api/chartmetric/[...path]/route.ts incorrectly states "Deducts 1 credit per
call" while the proxy implementation deducts 5 credits; update the handler's
JSDoc comments (the top-of-file comment and the inline comment at line 25) to
state "Deducts 5 credits per call" so the documentation matches the runtime
behavior of the route handler in route.ts.
🧹 Nitpick comments (1)
app/api/chartmetric/[...path]/route.ts (1)

16-19: Extract shared GET/POST proxy flow into a single helper.

Line 16-19 and Line 33-36 duplicate the same params-await + proxy call path. A shared internal handler keeps this route simpler and easier to evolve.

As per coding guidelines: "**/*.{ts,tsx}: Extract shared logic into reusable utilities following Don't Repeat Yourself (DRY) principle."

Also applies to: 33-36

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@app/api/chartmetric/`[...path]/route.ts around lines 16 - 19, Extract the
duplicated params-await + proxy call into a small internal helper (e.g.,
handleChartmetricProxy) and have both exported GET and POST call it;
specifically, move the logic that awaits context.params and calls
proxyChartmetricRequest(request, params) into a single function (referencing
GET, POST, context.params, and proxyChartmetricRequest) and replace the
duplicated bodies in GET and POST with a call to that helper to keep the route
DRY and easier to evolve.
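
The helper extraction described above can be sketched as follows. handleChartmetricProxy mirrors the name suggested in the review; the proxyChartmetricRequest body here is a stub standing in for the PR's real function so the shape is self-contained:

```typescript
// Sketch of the suggested DRY helper for the catch-all route.
type Params = { path: string[] };
type Ctx = { params: Promise<Params> };

// Stub standing in for the PR's proxyChartmetricRequest.
async function proxyChartmetricRequest(request: Request, params: Params): Promise<Response> {
  return new Response(JSON.stringify({ proxied: params.path.join("/") }));
}

// Shared flow: await the route params, then forward to the proxy.
async function handleChartmetricProxy(request: Request, context: Ctx): Promise<Response> {
  const params = await context.params;
  return proxyChartmetricRequest(request, params);
}

// In route.ts these would be `export const GET = ...` / `export const POST = ...`.
const GET = handleChartmetricProxy;
const POST = handleChartmetricProxy;
```

Both verbs then share one code path, so a change to param handling (e.g. adding Zod validation) happens in exactly one place.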

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 72e5ab29-cc6d-47bc-8be6-b19b89c7e43d

📥 Commits

Reviewing files that changed from the base of the PR and between 21ba61f and 0df36b5.

📒 Files selected for processing (1)
  • app/api/chartmetric/[...path]/route.ts

Comment on lines +16 to +18
export async function GET(request: NextRequest, context: { params: Promise<{ path: string[] }> }) {
const params = await context.params;
return proxyChartmetricRequest(request, params);

⚠️ Potential issue | 🟠 Major

Add runtime Zod validation for route params before proxying.

Line 16 and Line 33 only use TypeScript typing; context.params is still unvalidated runtime input. Add a validate function (Zod + safeParse) and return 400 for invalid path payloads.

✅ Proposed validation pattern
-import { type NextRequest } from "next/server";
+import { type NextRequest, NextResponse } from "next/server";
+import { z } from "zod";
 import { proxyChartmetricRequest } from "@/lib/chartmetric/proxyChartmetricRequest";
+
+const chartmetricParamsSchema = z.object({
+  path: z.array(z.string().min(1)).min(1),
+});
+
+function validateChartmetricParams(input: unknown) {
+  const parsed = chartmetricParamsSchema.safeParse(input);
+  if (!parsed.success) {
+    return NextResponse.json(
+      { status: "error", error: "Invalid Chartmetric path" },
+      { status: 400 },
+    );
+  }
+  return parsed.data;
+}
@@
 export async function GET(request: NextRequest, context: { params: Promise<{ path: string[] }> }) {
-  const params = await context.params;
-  return proxyChartmetricRequest(request, params);
+  const validated = validateChartmetricParams(await context.params);
+  if (validated instanceof NextResponse) return validated;
+  return proxyChartmetricRequest(request, validated);
 }
@@
 export async function POST(request: NextRequest, context: { params: Promise<{ path: string[] }> }) {
-  const params = await context.params;
-  return proxyChartmetricRequest(request, params);
+  const validated = validateChartmetricParams(await context.params);
+  if (validated instanceof NextResponse) return validated;
+  return proxyChartmetricRequest(request, validated);
 }

As per coding guidelines: "All API endpoints should use a validate function for input parsing using Zod for schema validation."

Also applies to: 33-35

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@app/api/chartmetric/`[...path]/route.ts around lines 16 - 18, The route
handler GET is relying only on TypeScript types for context.params (a runtime
input) before calling proxyChartmetricRequest; add a Zod schema (e.g.,
pathSchema = z.object({ path: z.array(z.string()) })) and a validate helper that
uses schema.safeParse to validate await context.params, and if validation fails
return a NextResponse with status 400 and a descriptive error; once validated
pass the parsed value to proxyChartmetricRequest (repeat same validation pattern
for the other handler around lines 33-35) so all runtime route params are
validated before proxying.

